hinge function
Positivity sets of hinge functions
Schicho, Josef, Tewari, Ayush Kumar, Warren, Audie
In this paper we investigate which subsets of the real plane are realisable as the set of points on which a one-layer ReLU neural network takes a positive value. In the case of cones we give a full characterisation of such sets. Furthermore, we give a necessary condition for any subset of $\mathbb{R}^d$ to be such a positivity set, and we give various examples of such one-layer neural networks.
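For illustration only (not code from the paper): the sketch below evaluates a one-layer ReLU network $f(x) = c + \sum_i a_i \max(0, \langle w_i, x\rangle + b_i)$ on a grid and samples its positivity set $\{x \in \mathbb{R}^2 : f(x) > 0\}$. All weights, biases, and coefficients are made-up placeholders.

```python
# Minimal sketch: a one-layer ReLU (hinge) function on the plane and a grid
# sample of its positivity set. Parameters are hypothetical examples.
import numpy as np

def hinge_function(x, weights, biases, coeffs, c=0.0):
    """Evaluate f(x) = c + sum_i coeffs[i] * max(0, weights[i] . x + biases[i])."""
    pre = x @ weights.T + biases            # shape (..., num_units)
    return c + np.maximum(pre, 0.0) @ coeffs

# Hypothetical network with three hidden units.
W = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
b = np.array([0.0, 0.0, 1.0])
a = np.array([1.0, 1.0, -2.0])

# Sample the positivity set on a grid over [-2, 2]^2.
xs, ys = np.meshgrid(np.linspace(-2, 2, 201), np.linspace(-2, 2, 201))
grid = np.stack([xs.ravel(), ys.ravel()], axis=1)
positive = hinge_function(grid, W, b, a) > 0   # boolean mask of the positivity set
print(f"{positive.mean():.2%} of sampled points lie in the positivity set")
```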
MARS: Multivariate Adaptive Regression Splines -- How to Improve on Linear Regression?
Machine Learning is making huge leaps forward, with an increasing number of algorithms enabling us to solve complex real-world problems. This story is part of a deep dive series explaining the mechanics of Machine Learning algorithms. In addition to giving you an understanding of how ML algorithms work, it also provides you with Python examples to build your own ML models. Before we dive into the specifics of MARS, I assume that you are already familiar with Linear Regression. Looking at the algorithm's full name -- Multivariate Adaptive Regression Splines -- you would be correct to guess that MARS belongs to the group of regression algorithms used to predict continuous (numerical) target variables.
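As a quick preview of what makes MARS different from plain linear regression, a fitted MARS model is an intercept plus a weighted sum of hinge basis functions max(0, x - t) and max(0, t - x). The following sketch is illustrative only; the knots and coefficients are made-up placeholders, not output of any fitted model.

```python
# Minimal illustration of the MARS prediction form: a piecewise-linear sum of
# hinge basis functions. Knots and coefficients below are hypothetical.
import numpy as np

def hinge(x, knot, direction):
    """Hinge basis: max(0, x - knot) if direction=+1, max(0, knot - x) if -1."""
    return np.maximum(0.0, direction * (x - knot))

def mars_predict(x, intercept, terms):
    """terms: list of (coefficient, knot, direction) for a single input feature."""
    return intercept + sum(coef * hinge(x, knot, d) for coef, knot, d in terms)

x = np.linspace(0, 10, 11)
# Hypothetical piecewise-linear fit with knots at 3 and 7.
y_hat = mars_predict(x, intercept=2.0,
                     terms=[(0.5, 3.0, +1), (-1.2, 7.0, +1), (0.3, 3.0, -1)])
print(y_hat)
```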
Dense Morphological Network: An Universal Function Approximator
Mondal, Ranjan, Santra, Sanchayan, Chanda, Bhabatosh
Artificial neural networks are built on the basic operations of linear combination and a non-linear activation function. Theoretically, this structure can approximate any continuous function with a three-layer architecture, but in practice learning the parameters of such a network can be hard, and the choice of activation function can greatly impact performance. In this paper we propose to replace the basic linear combination operation with non-linear operations that do away with the need for an additional non-linear activation function. To this end we propose the use of elementary morphological operations (dilation and erosion) as the basic operation in neurons. We show that these networks (denoted DenMo-Net) with morphological operations can approximate any smooth function with fewer parameters than are necessary for conventional neural networks. The results show that our network performs favourably when compared with similarly structured networks.
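As a rough sketch of the idea (an assumption based on the standard max-plus and min-minus grayscale definitions of dilation and erosion, not the authors' released code): a dilation neuron computes max_i(x_i + w_i) and an erosion neuron computes min_i(x_i - w_i), so the max/min replaces the usual weighted sum and no separate activation is needed.

```python
# Minimal sketch of morphological (dilation/erosion) neurons, assuming the
# standard grayscale definitions: dilation = max_i(x_i + w_i),
# erosion = min_i(x_i - w_i). Weights below are random placeholders.
import numpy as np

def dilation_layer(x, weights):
    """x: (batch, d) inputs; weights: (units, d). Returns (batch, units)."""
    return np.max(x[:, None, :] + weights[None, :, :], axis=-1)

def erosion_layer(x, weights):
    """x: (batch, d) inputs; weights: (units, d). Returns (batch, units)."""
    return np.min(x[:, None, :] - weights[None, :, :], axis=-1)

# Toy forward pass with made-up weights.
x = np.array([[0.2, -1.0, 0.5]])
W = np.random.default_rng(0).normal(size=(4, 3))
print(dilation_layer(x, W), erosion_layer(x, W))
```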